103 research outputs found

    Simulated maximum likelihood for general stochastic volatility models: a change of variable approach

    Maximum likelihood has proved to be a valuable tool for fitting the log-normal stochastic volatility model to financial returns time series. Using a sequential change of variable framework, we are able to cast more general stochastic volatility models into a form appropriate for importance samplers based on the Laplace approximation. We apply the methodology to two example models, showing that efficient importance samplers can be constructed even for highly non-Gaussian latent processes such as square-root diffusions.

    Keywords: Change of variable; Heston model; Laplace importance sampler; Simulated maximum likelihood; Stochastic volatility
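    As a concrete illustration of the kind of sampler this abstract describes, here is a minimal sketch of a Laplace-type importance sampler for the basic log-normal SV model (h_t = mu + phi (h_{t-1} - mu) + sigma eta_t, y_t = exp(h_t/2) eps_t). All parameter values are illustrative assumptions, and the BFGS inverse-Hessian approximation stands in for the exact curvature at the mode; this is not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp
from scipy.stats import multivariate_normal, norm

rng = np.random.default_rng(1)

def joint_logpdf(h, y, mu, phi, sigma):
    """log p(y, h) for the log-normal SV model, all constants kept."""
    s0 = sigma / np.sqrt(1.0 - phi**2)                 # stationary std of h_0
    lp = norm.logpdf(h[0], mu, s0)
    lp += norm.logpdf(h[1:], mu + phi * (h[:-1] - mu), sigma).sum()
    lp += norm.logpdf(y, 0.0, np.exp(h / 2.0)).sum()   # measurement density
    return lp

def laplace_is_loglik(y, mu, phi, sigma, S=2000):
    """Estimate the log of the integral of p(y, h) over h by importance sampling."""
    T = len(y)
    # 1) Laplace step: locate the mode of h |-> log p(y, h).
    res = minimize(lambda h: -joint_logpdf(h, y, mu, phi, sigma),
                   np.full(T, mu), method="BFGS")
    # 2) Gaussian importance density centred at the mode; BFGS's inverse-Hessian
    #    approximation is used as its covariance (a sketch-level shortcut).
    q = multivariate_normal(res.x, res.hess_inv, allow_singular=True)
    hs = q.rvs(S, random_state=rng)
    logw = np.array([joint_logpdf(h, y, mu, phi, sigma) for h in hs]) - q.logpdf(hs)
    return logsumexp(logw) - np.log(S)                 # log of the IS average

# Simulate a short series and evaluate the simulated log-likelihood once.
mu, phi, sigma, T = -1.0, 0.95, 0.2, 50
h = np.empty(T)
h[0] = mu + sigma / np.sqrt(1 - phi**2) * rng.standard_normal()
for t in range(1, T):
    h[t] = mu + phi * (h[t-1] - mu) + sigma * rng.standard_normal()
y = np.exp(h / 2.0) * rng.standard_normal(T)
print("simulated log-likelihood:", laplace_is_loglik(y, mu, phi, sigma))
```

    Averaging the weights in log space via logsumexp keeps the estimator numerically stable for longer series.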

    Simulated Maximum Likelihood Estimation for Latent Diffusion Models

    In this paper a method is developed and implemented for the simulated maximum likelihood estimation of latent diffusions based on discrete data. The method is applicable to diffusions that either have latent elements in the state vector or are observed only at discrete times with noise. Latent diffusions are very important in practical applications in financial economics. The proposed approach synthesizes the closed-form method of Aït-Sahalia (2008) and the efficient importance sampler of Richard and Zhang (2007). It does not require any infill observations to be introduced and hence is computationally tractable. The Monte Carlo study shows that the method works well in finite samples. The empirical applications illustrate the usefulness of the method and find no evidence of infinite variance in the importance sampler.

    Keywords: Closed-form approximation; Diffusion model; Efficient importance sampler
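    The efficient importance sampler of Richard and Zhang (2007) referenced here can be sketched in isolation. Below, a toy one-period integrand (an assumption, standing in for the paper's diffusion kernel) is integrated by iteratively refitting a Gaussian proposal: the log integrand is regressed on the Gaussian family's sufficient statistics under common random numbers, and the fitted kernel becomes the next proposal.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

rng = np.random.default_rng(2)
y = 0.8  # a single observation (illustrative)

def log_phi(h):
    # Toy one-period integrand p(y | h) p(h): y | h ~ N(0, e^h), h ~ N(0, 1).
    # It stands in for the period-t kernel of a latent diffusion model.
    return norm.logpdf(y, 0.0, np.exp(h / 2.0)) + norm.logpdf(h, 0.0, 1.0)

def eis_loglik(S=2000, iters=5):
    m, v = 0.0, 1.0                   # initial Gaussian proposal N(m, v)
    z = rng.standard_normal(S)        # common random numbers across iterations
    for _ in range(iters):
        h = m + np.sqrt(v) * z
        # Regress log phi on the Gaussian family's sufficient statistics:
        X = np.column_stack([np.ones(S), h, h**2])
        _, b, c = np.linalg.lstsq(X, log_phi(h), rcond=None)[0]
        v = -1.0 / (2.0 * c)          # match exp(b h + c h^2) to N(m, v)
        m = b * v
    h = m + np.sqrt(v) * z            # final draws from the fitted proposal
    logw = log_phi(h) - norm.logpdf(h, m, np.sqrt(v))
    return logsumexp(logw) - np.log(S)

print("EIS log-likelihood estimate:", eis_loglik())
```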

    Estimating the GARCH Diffusion: Simulated Maximum Likelihood in Continuous Time

    A new algorithm is developed to provide a simulated maximum likelihood estimation of the GARCH diffusion model of Nelson (1990) based on return data only. The method combines two accurate approximation procedures, namely, the polynomial expansion of Aït-Sahalia (2008) to approximate the transition probability density of return and volatility, and the Efficient Importance Sampler (EIS) of Richard and Zhang (2007) to integrate out the volatility. The first and second order terms in the polynomial expansion are used to generate a baseline importance density for an EIS algorithm. The higher order terms are included when evaluating the importance weights. Monte Carlo experiments show that the new method works well and the discretization error is well controlled by the polynomial expansion. In the empirical application, we fit the GARCH diffusion to equity data, perform diagnostics on the model fit, and test the finiteness of the importance weights.

    Keywords: Efficient importance sampling; GARCH diffusion model; Simulated maximum likelihood; Stochastic volatility
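    For orientation, here is a minimal Euler-Maruyama simulation of the model being estimated, taking the variance dynamics of Nelson's (1990) GARCH diffusion as dV_t = theta (omega - V_t) dt + alpha V_t dW_t; the parameter names and values below are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_garch_diffusion(theta, omega, alpha, T=252, substeps=10, dt=1/252):
    """Daily returns from the GARCH diffusion; `substeps` Euler steps per day."""
    h = dt / substeps
    v = omega                      # start the variance at its long-run mean
    returns = np.empty(T)
    for t in range(T):
        r = 0.0
        for _ in range(substeps):
            z1, z2 = rng.standard_normal(2)     # independent Brownian shocks
            r += np.sqrt(max(v, 0.0) * h) * z1  # return increment: sqrt(V) dW1
            v += theta * (omega - v) * h + alpha * v * np.sqrt(h) * z2
            v = max(v, 1e-12)      # keep the variance positive after the step
        returns[t] = r
    return returns

rets = simulate_garch_diffusion(theta=4.0, omega=0.04, alpha=0.8)
print("annualised vol of simulated returns:", rets.std() * np.sqrt(252))
```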

    Simulated Maximum Likelihood Estimation of Continuous Time Stochastic Volatility Models

    In this paper we develop and implement a method for maximum simulated likelihood estimation of the continuous-time stochastic volatility model with constant elasticity of volatility. The approach requires observations neither on option prices nor on volatility. To integrate out the latent volatility from the joint density of return and volatility, a modified efficient importance sampling technique is used after the continuous-time model is approximated with the Euler-Maruyama scheme. Monte Carlo studies show that the method works well, and the empirical applications illustrate its usefulness. The empirical results provide strong evidence against the Heston model.

    Keywords: Efficient importance sampler; Constant elasticity of volatility
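    The Euler-Maruyama approximation mentioned here can be made concrete. The sketch below (illustrative names and values, not the paper's code) writes out the Gaussian transition and measurement densities that the discretisation produces for the CEV volatility model dV_t = kappa (theta - V_t) dt + sigma V_t^gamma dW_t, where gamma = 1/2 recovers the Heston model; these are the densities the importance sampler then integrates over.

```python
import numpy as np
from scipy.stats import norm

def euler_transition_logpdf(v_next, v, kappa, theta, sigma, gamma, dt):
    """log p(v_next | v): Gaussian Euler step of the CEV variance process."""
    mean = v + kappa * (theta - v) * dt      # mean-reverting drift
    sd = sigma * v**gamma * np.sqrt(dt)      # CEV diffusion term
    return norm.logpdf(v_next, mean, sd)

def return_logpdf(y, v, dt):
    """log p(y | v): Euler step of the return equation d log S = sqrt(V) dW."""
    return norm.logpdf(y, 0.0, np.sqrt(v * dt))

# One variance transition under Heston (gamma = 1/2) vs. a higher elasticity:
print(euler_transition_logpdf(0.041, 0.04, 5.0, 0.04, 0.5, 0.5, 1/252),
      euler_transition_logpdf(0.041, 0.04, 5.0, 0.04, 0.5, 1.0, 1/252))
```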

    Experimental design for parameter estimation in steady-state linear models of metabolic networks

    Metabolic networks are typically large, containing many metabolites and reactions. Dynamical models that aim to simulate such networks consist of a large number of ordinary differential equations, with many kinetic parameters that must be estimated from experimental data. We assume these data to be metabolomics measurements made under steady-state conditions for different input fluxes. Assuming linear kinetics, analytical criteria for parameter identifiability are provided. For normally distributed error terms, we also calculate the Fisher information matrix analytically, to be used in the D-optimality criterion. A test network illustrates the developed tool chain for finding an optimal experimental design: the first stage verifies global or pointwise parameter identifiability, the second finds optimal input fluxes, and the final stage removes redundant measurements.
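    This tool chain can be sketched end to end for a toy network. The two-reaction linear-kinetics chain below is an assumption, not the paper's test network; under Gaussian errors the Fisher information is the sum of sensitivity cross-products over inputs, a singular matrix (log det = -inf) flags non-identifiability (stage one), and candidate input-flux designs are ranked by the D-criterion log det FIM (stage two).

```python
import numpy as np
from itertools import combinations

def steady_state(theta, u):
    # Toy linear-kinetics chain u -> x1 -> x2 -> out with rate constants theta:
    # at steady state the input flux u passes through every reaction, so the
    # metabolite levels satisfy theta_j * x_j = u, i.e. x_j = u / theta_j.
    return u / theta

def jacobian(theta, u, eps=1e-6):
    """Forward-difference sensitivities of the steady state w.r.t. theta."""
    f0 = steady_state(theta, u)
    J = np.empty((len(f0), len(theta)))
    for j in range(len(theta)):
        tp = theta.copy()
        tp[j] += eps
        J[:, j] = (steady_state(tp, u) - f0) / eps
    return J

def log_det_fim(theta, inputs, sigma=0.05):
    """D-criterion: log det of the Fisher information under N(0, sigma^2) errors."""
    p = len(theta)
    fim = np.zeros((p, p))
    for u in inputs:
        J = jacobian(theta, u)
        fim += J.T @ J / sigma**2
    sign, logdet = np.linalg.slogdet(fim)
    return logdet if sign > 0 else -np.inf  # -inf flags non-identifiability

theta = np.array([1.0, 2.0])                # nominal rate constants (assumed)
candidates = [0.5, 1.0, 2.0, 4.0]           # candidate input fluxes
best = max(combinations(candidates, 2), key=lambda d: log_det_fim(theta, d))
print("D-optimal pair of input fluxes:", best)
```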

    A flexible and automated likelihood based framework for inference in stochastic volatility models

    Funding: Ministry of Education, Singapore, under its Academic Research Funding Tier
